
# Trainable on a single GPU

TinyMistral-248M GGUF
License: Apache-2.0
TinyMistral-248M is a small language model derived from the Mistral 7B architecture and pre-trained from scratch, with its parameter count scaled down to approximately 248 million. It is intended primarily for fine-tuning on downstream tasks.
Tags: Large Language Model, English
Author: afrideva